Sarah Guldentops's profile

Integration project - Interactive visualizations

This was my last integration project in my bachelor's degree in Multimedia and Communication Technology. I worked in a group of six people for three weeks. The client for this project was our own school, and we had to put together a show in which all the integration projects would be showcased.

The theme of the project was disruption, a theme our school would use for the next academic year. Based on that theme, my group decided to use something very personal: a person's own body and its physical reactions to outside stimuli. We wanted to convey the message that not only can your phone or laptop be hacked to find out information about you, but your own physical responses can also be used to collect information without you knowing about it.

At first we had the idea of giving everyone in the audience a bracelet with wearable sensors such as a pulse sensor, a temperature sensor and an accelerometer. The data gathered from all the bracelets could then be used to generate custom visuals or alter projections during the show. If possible, we would also use the data to control the spotlights on the stage. After discussing this idea with our professors, we realised it would be very difficult to develop enough bracelets in the two weeks we had left at that point. We also learned that it would not be easy to gather the data from all those bracelets.

We then decided to downsize our idea by tracking only one person, but in a big way. The idea was to put one person on stage in a chair attached to a variety of sensors and cables. We would use these sensors to track the person's feelings and reactions during the show, and the sensor data would be used to generate custom visuals.

The end result can be seen below. We built the chair ourselves using an existing project and a CNC cutter. To track someone we used a pulse sensor, an accelerometer and a sweat sensor, which were incorporated into a glove. We used Arduino to program the sensors and attached the entire system to a laptop to retrieve the data. Besides the glove, we also added a fake breathing sensor on a strap around the person's waist and a cap with thick wires, to give the person even more of a feeling of being monitored. Lastly, we programmed our own visuals using C++ and Xcode.

Below are pictures of our chair and our setup.
The poster below gives an overview of 100 possible visualisations. Depending on a person's heartbeat, temperature and movement, the visualisation changes. The ones on this poster were hard-coded and not based on real data.